ICFP contest in progress!

On the off-chance you hadn't realized it, the 2007 ICFP contest started today. Join the fun!

AngloHaskell 2007 - date and venue confirmed

The date and venue for AngloHaskell 2007 have been finalised and announced:

We are pleased to announce AngloHaskell 2007

http://www.haskell.org/haskellwiki/AngloHaskell

Dates: 10th-11th of August (Friday-Saturday)
Location: Cambridge, with talks at Microsoft Research on Friday

All the details are on the wiki page, along with free registration.
Everyone is invited, we will have a day of talks at MSR, then a day of
other activities. There will be plenty of chance for general
discussions on anything.

If anyone in Cambridge is able to accommodate a few people for the
Friday or Saturday night, please add your name to the wiki, and accept
our thanks in advance. All that is needed is floor space.

Thanks

Neil and Philippa

LtUers are all welcome - especially anyone who wants to give a talk! As Neil put it in a previous mail:

This is NOT an academic conference. Everyone is welcome to attend, there is no fee. Everyone is invited to offer a talk.

Practical talks are particularly welcome. I'll be giving a talk on my experiences with Haskell as a person with Asperger's Syndrome.

A Natural Axiomatization of Church's Thesis

A Natural Axiomatization of Church's Thesis. Nachum Dershowitz and Yuri Gurevich. July 2007.

The Abstract State Machine Thesis asserts that every classical algorithm is behaviorally equivalent to an abstract state machine. This thesis has been shown to follow from three natural postulates about algorithmic computation. Here, we prove that augmenting those postulates with an additional requirement regarding basic operations implies Church's Thesis, namely, that the only numeric functions that can be calculated by effective means are the recursive ones (which are the same, extensionally, as the Turing-computable numeric functions). In particular, this gives a natural axiomatization of Church's Thesis, as Gödel and others suggested may be possible.

While this paper doesn't deal directly with programming languages, I still think it might be of interest, since computability (or effective computation, if you prefer) is a frequent concern in our field and in our discussions.

The idea that Church's Thesis can be proven is not new, and indeed there seems to be a cottage industry devoted to the issue (for a quick glance, search the paper for references to Gandy and work related to his).

Even if the discussion of ASMs is not your cup of tea, it seems like a good idea to keep in mind the distinctions elaborated in the conclusions section between "Thesis M" (the "Physical C-T Thesis"), the "AI Thesis", and the "standard" meaning of the C-T thesis.

Another quote (from the same section) that may prove useful in future LtU discussions: "We should point out that, nowadays, one deals daily with more flexible notions of algorithm, such as interactive and distributed computations. To capture such non-sequential processes and non-classical algorithms, additional postulates are required."

Lambda Animator

Lambda Animator from Mike Thyer is a tool for displaying and experimenting with alternative reduction strategies in the lambda calculus. The tool supports a number of reduction strategies, including completely lazy evaluation.
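If you want to play with the distinction outside the tool, here is a minimal sketch of two of the standard strategies - normal order and applicative order - for untyped lambda terms in Haskell. It is not taken from Lambda Animator, and it uses naive substitution with no capture-avoiding renaming, so it is only meant for small closed examples:

-- Untyped lambda terms with string-named variables.
data Term = Var String | Lam String Term | App Term Term
  deriving Show

-- Naive substitution: no renaming of binders, so use only where
-- variable capture cannot occur.
subst :: String -> Term -> Term -> Term
subst x s (Var y)   = if x == y then s else Var y
subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
subst x s (App f a) = App (subst x s f) (subst x s a)

-- One step of normal-order reduction: contract the leftmost-outermost redex.
stepNormal :: Term -> Maybe Term
stepNormal (App (Lam x b) a) = Just (subst x a b)
stepNormal (App f a) = case stepNormal f of
                         Just f' -> Just (App f' a)
                         Nothing -> App f <$> stepNormal a
stepNormal (Lam x b) = Lam x <$> stepNormal b
stepNormal _         = Nothing

-- One step of applicative-order reduction: reduce inside the function and
-- the argument before contracting the outer redex (leftmost-innermost).
stepApplicative :: Term -> Maybe Term
stepApplicative (App f a) =
  case stepApplicative f of
    Just f' -> Just (App f' a)
    Nothing -> case stepApplicative a of
      Just a' -> Just (App f a')
      Nothing -> case f of
        Lam x b -> Just (subst x a b)
        _       -> Nothing
stepApplicative (Lam x b) = Lam x <$> stepApplicative b
stepApplicative _         = Nothing

-- Iterate a step function to normal form (which may, of course, diverge).
normalise :: (Term -> Maybe Term) -> Term -> Term
normalise step t = maybe t (normalise step) (step t)

With omega defined as App (Lam "x" (App (Var "x") (Var "x"))) (Lam "x" (App (Var "x") (Var "x"))), normalise stepNormal (App (Lam "y" (Lam "z" (Var "z"))) omega) terminates, while the applicative-order version loops forever - the same kind of difference the animator lets you watch graph by graph.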

The tool can be invoked in several different modes (via JWS or as an applet) and contains many examples; the documentation provides clear definitions of the sometimes confusing terminology in the field.

Notice that the "step" and "run" buttons only work when rendering new graphs, which only works if you are running in trusted mode and have Graphviz installed. Otherwise you'll have to use the up/down cursor keys or the scroll bar to review already rendered graphs (which exist for all the examples).

The site lists relevant papers and dissertations.

HOPL-III: Statecharts in the Making

Another HOPL-III paper: Statecharts in the Making: A Personal Account by David Harel. This paper reads quite differently from most of the others, as the author admits to being mostly an accidental PL designer - backing into the field from a mathematical perspective.

This paper is a highly personal and subjective account of how the language of statecharts came into being. The main novelty of the language is in being a fully executable visual formalism intended for capturing the behavior of complex real-world systems, and an interesting aspect of its history is that it illustrates the advantages of theoreticians venturing out into the trenches of the real world, "dirtying their hands" and working closely with the system's engineers. The story is told in a way that puts statecharts into perspective and discusses the role of the language in the emergence of broader concepts, such as visual formalisms in general, reactive systems, model-driven development, model executability and code generation.

The Statecharts language arose from the domain of avionics and real-time state modeling. The author's main goal was to turn what were visual doodles into executable models - finite-state automata. Both UML and Rhapsody use parts of the Statecharts engine. The paper provides a good background for the subject of visual programming languages - a topic that periodically crops up on LtU. I found the emphasis on topology, as opposed to geometry, as the mathematical basis of visual programming to be of interest (though perhaps obvious to those who are more familiar with the subject):

When it comes to visuality, encapsulation and side-by-side adjacency are topological notions, just like edge connectivity, and are therefore worthy companions to edges in hierarchical extensions of graphs. Indeed, I believe that topology should be used first when designing a graphical language and only then one should move on to geometry. Topological features are a lot more fundamental than geometric ones, in that topology is a more basic branch of mathematics than geometry in terms of symmetries and mappings. One thing being inside another is more basic than it being smaller or larger than the other, or than one being a rectangle and the other a circle. Being connected to something is more basic than being green or yellow or being drawn with a thick line or with a thin line. I think the brain understands topological features given visually much better than it grasps geometrical ones. The mind can see easily and immediately whether things are connected or not, whether one thing encompasses another, or intersects it, etc.

This provides a nice rebuttal to the recent brouhaha from those who think math is irrelevant to process modeling - a solid mathematical foundation is even more critical for languages that rely on unusual modes of expression.
(Previous LtU links to HOPL-III papers)
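As a concrete illustration of what the hierarchy buys you - entirely my own invention, not Harel's notation or any real statecharts engine - nested states can be rendered as a small Haskell data structure:

-- A toy rendering of hierarchical states: a state is either simple, or a
-- composite with a designated initial sub-state and a list of sub-states.
type Event = String

data State = Simple String
           | Composite String State [State]   -- name, initial sub-state, sub-states
  deriving Show

data Transition = Transition { from :: String, on :: Event, to :: String }
  deriving Show

-- A CD player whose "Playing" state nests "Normal" and "Paused".
player :: State
player = Composite "Player" (Simple "Stopped")
           [ Simple "Stopped"
           , Composite "Playing" (Simple "Normal")
               [ Simple "Normal", Simple "Paused" ]
           ]

transitions :: [Transition]
transitions =
  [ Transition "Stopped" "play"  "Playing"   -- entering a composite enters its initial sub-state
  , Transition "Normal"  "pause" "Paused"
  , Transition "Paused"  "pause" "Normal"
  , Transition "Playing" "stop"  "Stopped"   -- one edge on the composite covers both sub-states
  ]

The last transition is where encapsulation earns its keep: a single edge from the enclosing "Playing" state replaces an edge from each of its sub-states, and that containment relation is exactly the kind of topological feature the quote above argues the eye picks up immediately.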

Simply Easy! (An Implementation of a Dependently Typed Lambda Calculus)

Simply Easy! (An Implementation of a Dependently Typed Lambda Calculus)
Andres Löh, Conor McBride and Wouter Swierstra

We present an implementation in Haskell of a dependently-typed lambda calculus that can be used as the core of a programming language. We show that a dependently-typed lambda calculus is no more difficult to implement than other typed lambda calculi. In fact, our implementation is almost as easy as an implementation of the simply typed lambda calculus, which we emphasize by discussing the modifications necessary to go from one to the other. We explain how to add data types and write simple programs in the core language, and discuss the steps necessary to build a full-fledged programming language on top of our simple core.
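To appreciate the "almost as easy" claim it helps to remember how small the baseline is. The following is my own sketch of a simply typed lambda calculus checker in Haskell, not the paper's code (the paper uses a slightly different representation, separating inferable from checkable terms):

-- A baseline simply typed lambda calculus with explicitly annotated binders.
data Ty = TBase | TArr Ty Ty
  deriving (Eq, Show)

data Term = Var String
          | Lam String Ty Term
          | App Term Term
  deriving Show

type Ctx = [(String, Ty)]

-- Type checking is one structural recursion over the term.
typeOf :: Ctx -> Term -> Either String Ty
typeOf ctx (Var x) =
  maybe (Left ("unbound variable " ++ x)) Right (lookup x ctx)
typeOf ctx (Lam x t body) = do
  t' <- typeOf ((x, t) : ctx) body
  return (TArr t t')
typeOf ctx (App f a) = do
  tf <- typeOf ctx f
  ta <- typeOf ctx a
  case tf of
    TArr t1 t2 | t1 == ta -> return t2
    _                     -> Left "ill-typed application"

In the dependently typed calculus the paper implements, types are themselves terms and the checker additionally has to evaluate them before comparing, but the point of the paper is that the code keeps roughly this shape and size.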

The Pillars of Concurrency

Herb Sutter, The Pillars of Concurrency, Dr. Dobb's Journal, 2007-07-02

In his inaugural column, Herb makes the case that we must build a consistent mental model before talking about concurrency.

When I first read the abstract, I assumed that Sutter meant a mental model along the lines of the pi-calculus. But it turns out what he was talking about has more to do with outlining the fundamental things for which concurrency is used, what requirements those things imply, and what kind of solutions can be used to meet the requirements. Nothing startlingly new, although the overview Sutter provides is interesting.

From a language design perspective, it might be interesting to consider which of the "pillars" that Sutter identifies are supported by a given language, how well, and why the language developed that way. Many of the arguments about the "best" way to handle concurrency may simply boil down to differences in the application domains being considered.

Also interesting to note is that much of this column was inspired by work by David Callahan on concurrency support for Visual Studio, so the things that Sutter discusses provide some insight into what Microsoft is thinking about for future development products.

Theorem proving support in programming language semantics

Yves Bertot, one of the coauthors of the Coq'Art book, has made available a tech report on Theorem proving support in programming language semantics, which describes how Coq can be used to support an exhaustive "views" approach to program semantics, using a toy imperative language from Glyn Winskel's book, The Formal Semantics of Programming Languages, and specifying a family of kinds of semantics on it, following Winskel's treatment.

Bertot follows the work of Tobias Nipkow, in Winskel is (almost) Right: Towards a Mechanized Semantics Textbook using Isabelle, but he shows how Coq's support for reflection allows this treatment to be carried out in a lean, condensed manner. The 23-page paper covers Plotkin-style SOS semantics, Kahn's natural semantics, Dijkstra's weakest precondition semantics, and a simple CPO denotational semantics that I assume is due to Winskel, and finally uses the denotational model as a basis for abstract interpretation.
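For readers who haven't met Winskel's toy language: it is the usual IMP of arithmetic and boolean expressions, assignment, sequencing, conditionals and while loops. Purely as a point of reference (this is a plain Haskell evaluator in the natural-semantics style, not the Coq development from the report), it fits in a few lines:

import qualified Data.Map as M

-- Winskel-style IMP: arithmetic and boolean expressions, and commands.
type Var   = String
type Store = M.Map Var Integer

data AExp = Num Integer | Loc Var | Add AExp AExp | Sub AExp AExp
data BExp = BTrue | BFalse | Le AExp AExp | Not BExp | And BExp BExp
data Com  = Skip | Assign Var AExp | Seq Com Com
          | If BExp Com Com | While BExp Com

aeval :: Store -> AExp -> Integer
aeval _ (Num n)   = n
aeval s (Loc x)   = M.findWithDefault 0 x s
aeval s (Add a b) = aeval s a + aeval s b
aeval s (Sub a b) = aeval s a - aeval s b

beval :: Store -> BExp -> Bool
beval _ BTrue     = True
beval _ BFalse    = False
beval s (Le a b)  = aeval s a <= aeval s b
beval s (Not b)   = not (beval s b)
beval s (And a b) = beval s a && beval s b

-- Big-step ("natural semantics") execution of commands; While may diverge.
exec :: Store -> Com -> Store
exec s Skip         = s
exec s (Assign x a) = M.insert x (aeval s a) s
exec s (Seq c1 c2)  = exec (exec s c1) c2
exec s (If b c1 c2) = if beval s b then exec s c1 else exec s c2
exec s (While b c)  = if beval s b then exec (exec s c) (While b c) else s

-- Example: sum 0..9 into "s".
example :: Store
example = exec M.empty prog
  where
    prog = Seq (Assign "i" (Num 0)) (Seq (Assign "s" (Num 0)) loop)
    loop = While (Le (Loc "i") (Num 9))
                 (Seq (Assign "s" (Add (Loc "s") (Loc "i")))
                      (Assign "i" (Add (Loc "i") (Num 1))))

The report's contribution is of course not the evaluator itself, but carrying out this kind of development inside Coq for the several styles of semantics listed above.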

Multiscale Scheduling, Integrating Competitive and Cooperative Parallelism in Theory and in Practice

Multiscale Scheduling, Integrating Competitive and Cooperative Parallelism in Theory and in Practice.
G. E. Blelloch, L. Blum, M. Harchol-Balter, and R. Harper. February, 2007.

Our view is that the computing systems of the future exhibit parallelism at multiple levels of granularity, and that these resources must be shared among many users simultaneously. The combination of these two aspects, which we call multiscale parallelism, poses significant challenges for implementing next-generation systems...

We believe that the theoretical issues of multiscale scheduling cannot be properly addressed without carefully considering how these issues will work with particular applications and how they coordinate with the programming languages used to express the parallelism.

This proposed long-term research project (which as far as I can tell was not mentioned here before) is interesting on many levels. From a programming languages perspective the main idea seems to be extending the NESL language, which is based on data parallelism, with such concepts as speculative parallelism that mesh well with multilevel scheduling.
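For context, NESL-style data parallelism means applying the same work across a (possibly nested) collection, with the parallelism implicit in the collection operation rather than in explicit threads. A rough Haskell analogue using the parallel package - purely illustrative, not anything from the proposal - looks like this:

import Control.Parallel.Strategies (parMap, rdeepseq)

-- The same function mapped over a nested collection; the runtime is free to
-- evaluate the rows in parallel (compile with -threaded, run with +RTS -N).
sumSquares :: [[Int]] -> [Int]
sumSquares = parMap rdeepseq (sum . map (^ 2))

main :: IO ()
main = print (sumSquares [[1 .. 10000], [2, 4 .. 20000], [5 .. 5000]])

Speculative parallelism, by contrast, starts work whose results may turn out not to be needed, which is where the scheduling questions the proposal raises come in.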

The Evolution Of LINQ And Its Impact On The Design Of C#

MSDN Magazine, June 2007

"The language design team now had several prototypes to get feedback on. So we organized a usability study with many participants who had experience with both C# and SQL. The feedback was almost universally positive, but it was clear there was something missing. In particular, it was difficult for the developers to apply their knowledge of SQL because the syntax we thought was ideal didn’t map very well to their domain expertise."